
    Adaptive realization of a maximum likelihood time delay estimator

    Journal Article: This paper presents an adaptive maximum likelihood method for estimating the time difference of arrival of a source signal at two spatially separated sensors. It is well known that the maximum likelihood technique achieves the Cramer-Rao lower bound on the time delay estimation error under certain signal conditions. The α-ÎČ tracker is a heuristic mechanism that is heavily used in target tracking applications. In this work, we combine an adaptive realization of the maximum likelihood time delay estimator with the α-ÎČ tracker to obtain a significant improvement in the performance of the tracker. Experimental results showing 2 to 8 dB improvement in the mean-square estimation error over the conventional α-ÎČ tracker for various signal-to-noise ratios are also included in the paper.
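
    The α-ÎČ tracker referred to above is the standard fixed-gain, two-state (position/velocity) recursive filter. As a point of reference only, a minimal sketch of one update step is given below; the measurement would be a time-delay estimate such as the one produced by the ML estimator, and the function name and gain values are illustrative assumptions rather than details taken from the paper.

```python
def alpha_beta_step(x, v, z, dt, alpha, beta):
    """One update of a conventional alpha-beta tracker.

    x, v        : current delay and delay-rate estimates
    z           : new measurement (e.g. an ML time-delay estimate)
    dt          : time between measurements
    alpha, beta : fixed smoothing gains
    """
    # Predict the state one step ahead with a constant-velocity model.
    x_pred = x + v * dt
    # Innovation: measurement minus prediction.
    r = z - x_pred
    # Correct the position and velocity estimates with the fixed gains.
    x_new = x_pred + alpha * r
    v_new = v + (beta / dt) * r
    return x_new, v_new
```

    For example, a slowly drifting delay can be tracked by calling alpha_beta_step(x, v, z_k, dt=1.0, alpha=0.5, beta=0.1) once per new delay measurement z_k; supplying better measurements, as the adaptive ML estimator does, is the sense in which the front end improves the tracker.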

    Vector quantization using the L∞ distortion measure

    Journal Article: This paper considers vector quantization of signals using the L∞ distortion measure. The key contribution is a result that allows one to characterize the centroid of a set of vectors for the L∞ distortion measure. A method similar to the Linde-Buzo-Gray (LBG) algorithm for designing codebooks has been developed and tested. The paper also discusses the design of vector quantizers employing the L∞ distortion measure in applications in which the occurrence of quantization errors with magnitudes larger than a preselected threshold must be minimized.
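
    A well-known characterization of the centroid under an L∞ (minimax) criterion is the component-wise midrange, which minimizes the largest per-component error over the vectors in a cluster. The sketch below, offered only as an illustration of the general idea and not as the paper's specific design procedure, plugs that midrange centroid into a plain LBG-style iteration; the function names and training setup are assumptions.

```python
import numpy as np

def linf_centroid(cluster):
    """Component-wise midrange: minimizes the worst-case per-component
    error over the vectors assigned to the cluster."""
    return 0.5 * (cluster.min(axis=0) + cluster.max(axis=0))

def lbg_linf(data, codebook, iters=20):
    """LBG-style codebook refinement under the L-infinity distortion.
    data: (N, d) training vectors; codebook: (K, d) initial codewords."""
    codebook = np.asarray(codebook, dtype=float).copy()
    for _ in range(iters):
        # Assign each vector to its nearest codeword under the
        # L-infinity (Chebyshev) distance.
        dists = np.max(np.abs(data[:, None, :] - codebook[None, :, :]), axis=2)
        labels = dists.argmin(axis=1)
        # Midrange centroid update for every non-empty cell.
        for k in range(len(codebook)):
            cell = data[labels == k]
            if len(cell):
                codebook[k] = linf_centroid(cell)
    return codebook
```

    Replacing the usual mean with the midrange keeps the largest individual quantization error, rather than the average error, under control, which matches the threshold-limited application described in the abstract.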

    An analytical model of the perceptual threshold function for multichannel image compression

    Journal Article: The human observer is often the final judge of the quality of compressed images. One way to design a compression system that attempts to reduce or eliminate subjective distortions in the coded images is to incorporate a perceptual threshold function model into the compression system. The perceptual threshold function describes the amount of quantization error that can be introduced into a particular component of the image without introducing any visual distortions. This paper describes an analytical approach for the determination of the perceptual threshold values for use in an arbitrary multichannel image compression system. Experimental results obtained from a compression system that incorporates the perceptual threshold function are also included in the paper.
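
    In practical terms, a perceptual threshold function assigns to each channel (and possibly each location) the largest quantization error that remains invisible, and a coder then checks its errors, or derives its quantizer step sizes, from those thresholds. The fragment below is only a schematic of that test with hypothetical array names; it is not the analytical model developed in the paper.

```python
import numpy as np

def perceptually_transparent(coeffs, quantized, thresholds):
    """True if every quantization error stays below the perceptual
    threshold assigned to its channel/location (arrays of equal shape)."""
    return bool(np.all(np.abs(coeffs - quantized) < thresholds))

def step_sizes_from_thresholds(thresholds):
    """A uniform quantizer with step 2*T keeps its rounding error at or
    below T, so per-channel step sizes follow directly from the thresholds."""
    return 2.0 * thresholds
```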

    Perceptually lossless image compression

    Journal Article: This paper presents an algorithm for perceptually lossless image compression. A compressed image is said to be perceptually lossless for a specified viewing distance if the reconstructed image and the original image appear identical to human observers when viewed from the specified distance. Our approach utilizes properties of the human visual system in the form of a perceptual threshold function model to determine the amount of distortion that can be introduced at each location of the image. Constraining all quantization errors to be below the perceptual threshold function results in perceptually lossless image compression. The compression system employs a modified form of the embedded zerotree wavelet coding algorithm to limit the quantization errors below the levels specified by the model of the threshold function. Experimental results demonstrate perceptually lossless compression of monochrome images at bit rates ranging from 0.4 to 1.2 bits per pixel at a viewing distance of six times the image height. These results were obtained using a simple, empirical model of the perceptual threshold function which included threshold elevations for the local brightness and the local energy in neighboring frequency bands.

    Perceptually lossless image compression

    Journal Article: There are many instances in which human observers are the final judges of the quality of a compressed image. In such situations, it is useful to incorporate a model of the human visual system (HVS) into the image compression system in order to reduce or even eliminate the visual distortions introduced into the reconstructed image. A compressed image is said to be perceptually lossless for a specified viewing distance if the reconstructed image and the original image appear identical to human observers when viewed from that distance. It is important to recognize that the perceptual quality of an image is a function of the viewing distance, and consequently the notion of perceptually lossless compression is also a function of the viewing distance. This paper presents an algorithm for perceptually lossless image compression. Our approach utilizes properties of the human visual system in the form of a perceptual threshold function (PTF) model [1]. The perceptual threshold function model determines the amount of distortion that can be introduced at each location of the image. Thus, constraining all quantization errors to levels below the PTF results in perceptually lossless image compression. Our system employs a modified form of the embedded zerotree wavelet (EZW) coding algorithm [2] that limits the quantization errors of the wavelet transform coefficients to levels below those specified by the model of the perceptual threshold function. The perceptual threshold function was obtained experimentally for the wavelet decomposition employed in the EZW algorithm, in a manner similar to that described in [1]. Modifications made to the EZW algorithm include stopping the coding process as soon as all the wavelet transform coefficients are coded to levels within the threshold values suggested by the PTF. Experimental results demonstrate perceptually lossless compression of monochrome images at bit rates ranging from 0.4 to 1.2 bits per pixel at a viewing distance of six times the image height and at bit rates from 0.2 to 0.5 bits per pixel at a viewing distance of twelve times the image height.
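
    The modification described in the abstract amounts to running the successive refinement passes of the embedded coder only until every coefficient's reconstruction error falls within its perceptual threshold. The toy loop below illustrates that stopping rule with plain successive-approximation quantization; it omits the zerotree structure and everything else that makes EZW an efficient coder, and it assumes strictly positive threshold values.

```python
import numpy as np

def refine_until_transparent(coeffs, ptf):
    """Successive-approximation refinement stopped by a perceptual
    threshold function (ptf must be > 0 everywhere)."""
    T = np.max(np.abs(coeffs)) / 2.0            # initial refinement step
    decoded = np.zeros_like(coeffs, dtype=float)
    while not np.all(np.abs(coeffs - decoded) <= ptf):
        # One more pass: push every coefficient to within T/2 of its value.
        decoded += T * np.round((coeffs - decoded) / T)
        T /= 2.0                                # the next pass is twice as fine
    return decoded
```

    Stopping the refinement at this point is what caps the bit rate: once every error is perceptually invisible, further passes would spend bits on distortion the viewer cannot see.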

    ‘Smart Cities’ – Dynamic Sustainability Issues and Challenges for ‘Old World’ Economies: A Case from the United Kingdom

    The rapid and dynamic rate of urbanization, particularly in emerging world economies, has resulted in a need to find sustainable ways of dealing with the excessive strains and pressures that come to bear on existing infrastructures and relationships. Increasingly during the twenty-first century, policy makers have turned to technological solutions to deal with this challenge and the dynamics inherent within it. This move towards the utilization of technology to underpin infrastructure has led to the emergence of the term ‘Smart City’. Smart cities incorporate technology-based solutions in their planning, development and operation. This paper explores the organizational issues and challenges facing a post-industrial agglomeration in the North West of England as it attempted to become a ‘Smart City’. In particular, the paper identifies and discusses the factors that posed significant challenges for the dynamic relationships between residents, policymakers and public and private sector organizations, and as a result aims to use these micro-level issues to inform the macro-debate and context of wider Smart City discussions. In order to achieve this, the paper develops a range of recommendations that are designed to inform Smart City design, planning and implementation strategies.

    A Farewell to Goodbyes: Reconciling the Past in Cheever’s “Goodbye, My Brother”

    La problĂ©matique de « Goodbye, My Brother », de John Cheever, s’articule autour du conflit entre le narrateur et son frĂšre, Lawrence, dans le rapport que chacun des deux entretient avec le passĂ©. La rancƓur de Lawrence Ă©mane de l’instabilitĂ© d’un passĂ© qui s’effrite. ComposĂ©e d’une longue sĂ©rie d’au revoir, sa vie le dĂ©tache de tout ce qui n’est pas Ă  l’instar de ses critĂšres d’excellence. En effet, la prĂ©sence de Lawrence Ă  la rĂ©union de famille se rĂ©vĂšle progressivement ancrĂ©e dans un but prĂ©cis : revendre les parts de sa maison et leur dire au revoir Ă  tout jamais. Le narrateur, quant Ă  lui, entreprend une analyse de la psychologie de son frĂšre, issue d’une autre perception du passĂ© axĂ©e sur l’histoire. A travers cette structure, l’attitude de Lawrence est prĂ©sentĂ©e comme produit du passĂ©, en allant de l’évĂšnement historique de la disparition en mer de son pĂšre Ă  la gĂ©nĂ©alogie puritaine de la famille Pommeroy. Tandis que Lawrence ne parvient pas Ă  percevoir les rĂ©percussions des forces historiques qui s’exercent sur lui, Cheever renverse judicieusement l’analyse du narrateur sur elle-mĂȘme. Au dĂ©nouement, le narrateur prend conscience que ce n’était pas tant la personne mĂȘme de Lawrence qu’il abhorrait en lui que son manque de clairvoyance pour dĂ©chiffrer cette histoire Ă  long terme ; aussi, se rĂ©concilie-t-il avec le passĂ©
    • 

    corecore